Pulse Deinterleaving for Multi-Function Radars with Hierarchical Deep Neural Networks
Authors
Abstract
Similar Resources
Deinterleaving Radar Pulse Trains Using Neural Networks
All known time of arrival (TOA) deinterleaving algorithms have great difficulty separating interleaved radar pulse trains when two or more of the trains are significantly jittered. Such scenarios tend to lead to overwhelming ambiguity and fragmentation difficulties when using traditional techniques. In this report we propose a modified sequence search technique combined with a novel recursive neural... (A minimal sketch of the classical TOA-difference baseline that such methods improve on appears after this list of similar resources.)
Multi-task Deep Neural Networks in Automated Protein Function Prediction
Background: In recent years, deep learning algorithms have outperformed the state-of-the-art methods in several areas such as computer vision and speech recognition, thanks to efficient methods for training and for preventing overfitting, advances in computer hardware, and the availability of vast amounts of data. The high performance of multi-task deep neural networks in drug discovery has attra...
Learning hierarchical categories in deep neural networks
A wide array of psychology experiments have revealed remarkable regularities in the developmental time course of human cognition. For example, infants generally acquire broad categorical distinctions (i.e., plant/animal) before finer-scale distinctions (i.e., dog/cat), often exhibiting rapid, or stage-like transitions. What are the theoretical principles underlying the ability of neuronal netwo...
Hierarchical Cloth Simulation using Deep Neural Networks
Fast and reliable physically-based simulation techniques are essential for providing flexible visual effects for computer graphics content. In this paper, we propose a fast and reliable hierarchical cloth simulation method, which combines conventional physically-based simulation with deep neural networks (DNN). Simulations of the coarsest level of the hierarchical model are calculated using con...
Why Deep Neural Networks for Function Approximation?
Recently there has been much interest in understanding why deep neural networks are preferred to shallow networks. We show that, for a large class of piecewise smooth functions, the number of neurons needed by a shallow network to approximate a function is exponentially larger than the corresponding number of neurons needed by a deep network for a given degree of function approximation. First, ...
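To make the TOA deinterleaving problem referenced in the first entry above more concrete, the sketch below illustrates the classical TOA-difference histogram baseline (the CDIF/SDIF family) that such work seeks to improve on. It is a generic illustration under assumed parameters (two emitters with made-up PRIs of 100 us and 137 us, light Gaussian jitter, arbitrary difference orders and bin widths), not the recursive-neural-network method of that paper and not the hierarchical DNN of the main article.

```python
# Minimal sketch of TOA-difference-histogram deinterleaving (classical baseline).
# All emitter parameters below are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(seed=0)

def pulse_train(pri_us, n_pulses, jitter_us, start_us=0.0):
    """Simulate one emitter: TOAs spaced by a PRI with additive Gaussian timing jitter."""
    toas = start_us + pri_us * np.arange(n_pulses)
    return toas + rng.normal(0.0, jitter_us, n_pulses)

# Two emitters with different pulse repetition intervals, merged at the receiver.
received = np.sort(np.concatenate([
    pulse_train(pri_us=100.0, n_pulses=200, jitter_us=0.5),
    pulse_train(pri_us=137.0, n_pulses=150, jitter_us=0.5, start_us=13.0),
]))

# Accumulate TOA differences of orders 1..3 (t[i+k] - t[i]). Differences between
# pulses of the same emitter pile up near that emitter's PRI, while cross-emitter
# differences spread out, so histogram peaks are PRI candidates.
diffs = np.concatenate([received[k:] - received[:-k] for k in (1, 2, 3)])
hist, edges = np.histogram(diffs, bins=np.arange(0.0, 160.0, 3.0))

# The two strongest bins should land near the true PRIs (~100 us and ~137 us).
for idx in np.argsort(hist)[::-1][:2]:
    print(f"PRI candidate in [{edges[idx]:.0f}, {edges[idx + 1]:.0f}) us, count = {hist[idx]}")
```

With heavy jitter or staggered PRIs these histogram peaks smear and fragment, which is exactly the failure mode the first entry above describes and one motivation for neural-network-based deinterleavers.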
Journal
Journal title: IEEE Transactions on Aerospace and Electronic Systems
Year: 2021
ISSN: 0018-9251, 1557-9603, 2371-9877
DOI: 10.1109/taes.2021.3079571